
    Multiple-objective sensor management and optimisation

    One of the key challenges associated with exploiting modern Autonomous Vehicle technology for military surveillance tasks is the development of Sensor Management strategies which maximise the performance of the on-board Data-Fusion systems. The focus of this thesis is the development of Sensor Management algorithms which aim to optimise target tracking processes. Three principal theoretical and analytical contributions are presented which are related to the manner in which such problems are formulated and subsequently solved.

    Firstly, the trade-offs between optimising target tracking and other system-level objectives relating to expected operating lifetime are explored in an autonomous ground sensor scenario. This is achieved by modelling the observer trajectory control design as a probabilistic, information-theoretic, multiple-objective optimisation problem. This novel approach explores the relationships between the changes in sensor-target geometry that are induced by tracking performance measures and those relating to power consumption. This culminates in a novel observer trajectory control algorithm based on the minimax approach.

    The second contribution is an analysis of the propagation of error through a limited-lookahead sensor control feedback loop. In the last decade, it has been shown that the use of such non-myopic (multiple-step) planning strategies can lead to superior performance in many Sensor Management scenarios. However, relatively little is known about the performance of strategies which use different horizon lengths. It is shown that, in the general case, planning performance is a function of the length of the horizon over which the optimisation is performed. While increasing the horizon maximises the chances of achieving global optimality, by revealing information about the substructure of the decision space, it also increases the impact of any prediction error, approximations, or unforeseen risk present within the scenario. These competing mechanisms are demonstrated using an example tracking problem. This provides the motivation for a novel sensor control methodology that employs an adaptive-length optimisation horizon. A route to selecting the optimal horizon size is proposed, based on a new non-myopic risk equilibrium which identifies the point where the two competing mechanisms are balanced.

    The third area of contribution concerns the development of a number of novel optimisation algorithms aimed at solving the resulting sequential decision-making problems. These problems are typically solved using stochastic search methods such as Genetic Algorithms or Simulated Annealing. The techniques presented in this thesis are extensions of the recently proposed Repeated Weighted Boosting Search algorithm. In its original form, it is only applicable to continuous, single-objective optimisation problems. The extensions facilitate application to mixed search spaces and Pareto multiple-objective problems. The resulting algorithms have performance comparable with Genetic Algorithm variants, and offer a number of advantages such as ease of implementation and limited tuning requirements.
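
The boosting-style stochastic search underlying the third contribution can be illustrated with a minimal scalar sketch. The objective function, weighting scheme, and update rule below are illustrative assumptions, not the thesis's actual Repeated Weighted Boosting Search:

```python
import random

def weighted_boosting_search(f, bounds, pop=20, iters=200, seed=0):
    """Minimise f over an interval by repeatedly replacing the worst
    member of a population with a perturbed, fitness-weighted centre.
    A simplified, scalar sketch of the boosting-search idea."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(iters):
        costs = [f(x) for x in xs]
        best, worst = min(costs), max(costs)
        span = (worst - best) or 1.0
        # Boosting-style weights: lower-cost members get larger weight.
        w = [(worst - c) / span + 1e-3 for c in costs]
        centre = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
        # Replace only the worst member, so the best found so far survives.
        xs[costs.index(worst)] = min(hi, max(lo,
            centre + rng.gauss(0.0, 0.1 * (hi - lo))))
    return min(xs, key=f)

# Illustrative run on a simple quadratic with minimum at x = 1.7.
x_star = weighted_boosting_search(lambda x: (x - 1.7) ** 2, (-5.0, 5.0))
```

Because only the worst member is resampled each iteration, the method needs little tuning beyond the population size and perturbation scale, which is the ease-of-use advantage the abstract claims.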

    The XMM-Newton serendipitous survey. VII. The third XMM-Newton serendipitous source catalogue

    Thanks to the large collecting area (3 × ~1500 cm^2 at 1.5 keV) and wide field of view (30' across in full field mode) of the X-ray cameras on board the European Space Agency X-ray observatory XMM-Newton, each individual pointing can result in the detection of hundreds of X-ray sources, most of which are newly discovered. Recently, many improvements in the XMM-Newton data reduction algorithms have been made. These include enhanced source characterisation and reduced spurious source detections, refined astrometric precision, greater net sensitivity and the extraction of spectra and time series for fainter sources, with better signal-to-noise. Further, almost 50% more observations are in the public domain compared to 2XMMi-DR3, allowing the XMM-Newton Survey Science Centre (XMM-SSC) to produce a much larger and better quality X-ray source catalogue. The XMM-SSC has developed a pipeline to reduce the XMM-Newton data automatically and, using improved calibration, a new catalogue version has been produced from XMM-Newton data made public by 2013 Dec. 31 (13 years of data). Manual screening ensures the highest data quality. This catalogue is known as 3XMM. In the latest release, 3XMM-DR5, there are 565962 X-ray detections comprising 396910 unique X-ray sources. For the 133000 brightest sources, spectra and lightcurves are provided. For all detections, the positions on the sky, a measure of the quality of the detection, and an evaluation of the X-ray variability are provided, along with the fluxes and count rates in 7 X-ray energy bands, the total 0.2-12 keV band counts, and four hardness ratios. To identify the detections, a cross correlation with 228 catalogues is also provided for each X-ray detection. 3XMM-DR5 is the largest X-ray source catalogue ever produced. Thanks to the large array of data products, it is an excellent resource in which to find new and extreme objects. Comment: 23 pages, version accepted for publication in A&
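
The four hardness ratios provided for each detection are contrasts between count rates in pairs of energy bands. A minimal sketch of the standard definition (the example rates are illustrative, not catalogue values):

```python
def hardness_ratio(rate_soft, rate_hard):
    """Hardness ratio between two energy bands:
    HR = (hard - soft) / (hard + soft), ranging from -1
    (all counts in the soft band) to +1 (all in the hard band)."""
    total = rate_soft + rate_hard
    if total == 0:
        raise ValueError("no counts in either band")
    return (rate_hard - rate_soft) / total

# Illustrative count rates in counts/s for two adjacent bands:
hr = hardness_ratio(0.012, 0.004)  # a soft-spectrum source, HR = -0.5
```

Being a ratio of band rates, HR is independent of the absolute flux scale, which makes it a convenient rough spectral classifier for faint sources that lack extracted spectra.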

    A Century of Cosmology

    In the century since Einstein's annus mirabilis of 1905, our concept of the Universe has expanded from Kapteyn's flattened disk of stars only 10 kpc across to an observed horizon about 30 Gpc across that is only a tiny fraction of an immensely large inflated bubble. The expansion of our knowledge about the Universe, both in the types of data and the sheer quantity of data, has been just as dramatic. This talk will summarize this century of progress and our current understanding of the cosmos. Comment: Talk presented at the "Relativistic Astrophysics and Cosmology - Einstein's Legacy" meeting in Munich, Nov 2005. Proceedings will be published in the Springer-Verlag "ESO Astrophysics Symposia" series. 10 pages Latex with 2 figure

    Attitudes of developing world physicians to where medical research is performed and reported

    Background: Little is known about the influence of the site of research or publication on the impact of the research findings on clinical practice, particularly in developing countries. The International Clinical Epidemiology Network (INCLEN) is dedicated to improving the quality of health research in the Developing World through institutional capacity building for evidence based medicine, and provided the opportunity to examine the likely impact of research location and journal location on physicians' practice in a number of the participating countries. Methods: Physicians from secondary and tertiary hospitals in six cities located in China, Thailand, India, Egypt and Kenya were enrolled in a cross-sectional questionnaire survey. The primary outcome measures were scores on a Likert scale reflecting stated likelihood of changing clinical practice depending on the source of the research or its publication. Results: Overall, local research and publications were most likely to effect change in clinical practice, followed by North American, European and regional research/publications respectively, although there were significant variations between countries. The impact of local and regional research would be greater if the perceived research quality improved in those settings. Conclusion: Conducting high quality local research is likely to be an effective way of getting research findings into practice in developing countries.

    High Energy Colliders as Black Hole Factories: The End of Short Distance Physics

    If the fundamental Planck scale is of order a TeV, as is the case in some extra-dimensions scenarios, future hadron colliders such as the Large Hadron Collider will be black hole factories. The non-perturbative process of black hole formation and decay by Hawking evaporation gives rise to spectacular events with up to many dozens of relatively hard jets and leptons, with a characteristic ratio of hadronic to leptonic activity of roughly 5:1. The total transverse energy of such events is typically a sizeable fraction of the beam energy. Perturbative hard scattering processes at energies well above the Planck scale are cloaked behind a horizon, thus limiting the ability to probe short distances. The high energy black hole cross section grows with energy at a rate determined by the dimensionality and geometry of the extra dimensions. This dependence therefore probes the extra dimensions at distances larger than the Planck scale. Comment: Latex, 28 pages. v4: minor changes, largely to agree with published version; appendix added comparing convention
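
Schematically, the energy dependence described above follows from treating black hole production as a geometric process, with the cross section set by the horizon radius of a black hole of mass equal to the collision energy. In one common convention (normalisations vary between papers), with n extra dimensions and fundamental Planck scale M_D:

```latex
\sigma(\sqrt{s}) \;\approx\; \pi\, r_s^2(\sqrt{s}), \qquad
r_s(M) \;=\; \frac{1}{\sqrt{\pi}\, M_D}
\left[ \frac{M}{M_D}\,
\frac{8\,\Gamma\!\left(\tfrac{n+3}{2}\right)}{n+2} \right]^{\frac{1}{n+1}}
```

Since r_s grows as M^{1/(n+1)}, the cross section grows with collision energy as σ ∝ s^{1/(n+1)}, which is the dimensionality dependence the abstract refers to.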

    Panchromatic spectral energy distributions of Herschel sources

    (abridged) Far-infrared Herschel photometry from the PEP and HerMES programs is combined with ancillary datasets in the GOODS-N, GOODS-S, and COSMOS fields. Based on this rich dataset, we reproduce the rest-frame UV-to-FIR ten-color distribution of galaxies using a superposition of multi-variate Gaussian modes. The median SED of each mode is then fitted with a modified version of the MAGPHYS code that combines stellar light, emission from dust heated by stars and a possible warm dust contribution heated by an AGN. The defined Gaussian grouping is also used to identify rare sources. The zoology of outliers includes Herschel-detected ellipticals, very blue z~1 Ly-break galaxies, quiescent spirals, and torus-dominated AGN with star formation. Out of these groups and outliers, a new template library is assembled, consisting of 32 SEDs describing the intrinsic scatter in the rest-frame UV-to-submm colors of infrared galaxies. This library is tested against L(IR) estimates with and without Herschel data included, and compared to eight other popular methods often adopted in the literature. When implementing Herschel photometry, these approaches produce L(IR) values consistent with each other within a median absolute deviation of 10-20%, the scatter being dominated more by fine tuning of the codes, rather than by the choice of SED templates. Finally, the library is used to classify 24 micron detected sources in PEP GOODS fields. AGN appear to be distributed in the stellar mass (M*) vs. star formation rate (SFR) space along with all other galaxies, regardless of the amount of infrared luminosity they are powering, with the tendency to lie on the high SFR side of the "main sequence". The incidence of warmer star-forming sources grows for objects with higher specific star formation rates (sSFR), and they tend to populate the "off-sequence" region of the M*-SFR-z space. Comment: Accepted for publication in A&A. Some figures are presented in low resolution. 
The new galaxy templates are available for download at the address http://www.mpe.mpg.de/ir/Research/PEP/uvfir_temp
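
The quoted 10-20% agreement between L(IR) estimators is a fractional median absolute deviation, which can be computed as in this sketch (the input luminosities are invented for illustration):

```python
import statistics

def fractional_mad(estimates):
    """Median absolute deviation of the estimates, expressed as a
    fraction of their median -- the agreement measure quoted for
    the different L(IR) estimation methods."""
    med = statistics.median(estimates)
    return statistics.median(abs(e - med) / med for e in estimates)

# Illustrative L(IR) values (solar luminosities) from hypothetical codes:
spread = fractional_mad([1.0e11, 1.1e11, 0.9e11, 1.05e11])
```

The median absolute deviation is preferred over a standard deviation here because a single badly tuned code produces an outlying estimate that would otherwise dominate the scatter.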

    Radio source calibration for the VSA and other CMB instruments at around 30 GHz

    Accurate calibration of data is essential for the current generation of CMB experiments. Using data from the Very Small Array (VSA), we describe procedures which will lead to an accuracy of 1 percent or better for experiments such as the VSA and CBI. Particular attention is paid to the stability of the receiver systems, the quality of the site and frequent observations of reference sources. At 30 GHz the careful correction for atmospheric emission and absorption is shown to be essential for achieving 1 percent precision. The sources for which a 1 percent relative flux density calibration was achieved included Cas A, Cyg A, Tau A and NGC7027 and the planets Venus, Jupiter and Saturn. A flux density, or brightness temperature in the case of the planets, was derived at 33 GHz relative to Jupiter, which was adopted as the fundamental calibrator. A spectral index at ~30 GHz is given for each. Cas A, Tau A, NGC7027 and Venus were examined for variability. Cas A was found to be decreasing at 0.394 ± 0.019 percent per year over the period March 2001 to August 2004. In the same period Tau A was decreasing at 0.22 ± 0.07 percent per year. A survey of the published data showed that the planetary nebula NGC7027 decreased at 0.16 ± 0.04 percent per year over the period 1967 to 2003. Venus showed an insignificant (1.5 ± 1.3 percent) variation with Venusian illumination. The integrated polarization of Tau A at 33 GHz was found to be 7.8 ± 0.6 percent at a position angle of 148° ± 3°. Comment: 13 pages, 15 figures, submitted to MNRA
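
The secular decreases quoted above are small enough that a constant fractional fade per year describes a calibrator well over survey timescales. A sketch, assuming the decline compounds annually (the 100 Jy starting flux density is illustrative, not a measured value):

```python
def secular_flux(s0, frac_per_year, years):
    """Flux density after `years`, assuming a constant fractional
    decrease per year, compounded annually -- e.g. the 0.394 %/yr
    quoted for Cas A at 33 GHz means frac_per_year = 0.00394."""
    return s0 * (1.0 - frac_per_year) ** years

# A hypothetical 100 Jy source fading at the Cas A rate for a decade:
s10 = secular_flux(100.0, 0.00394, 10)  # ~96.1 Jy
```

This is why a fading source like Cas A can only serve as a calibrator if its decline rate is monitored: over a decade the compounded change is several percent, larger than the 1 percent calibration goal.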

    HerMES: Current Cosmic Infrared Background Estimates Can be Explained by Known Galaxies and their Faint Companions at z < 4

    We report contributions to cosmic infrared background (CIB) intensities originating from known galaxies and their faint companions at submillimeter wavelengths. Using the publicly-available UltraVISTA catalog, and maps at 250, 350, and 500 μm from the Herschel Multi-tiered Extragalactic Survey (HerMES), we perform a novel measurement that exploits the fact that uncatalogued sources may bias stacked flux densities, particularly if the resolution of the image is poor, and intentionally smooth the images before stacking and summing intensities. By smoothing the maps we capture the contribution of faint (undetected at K_S ~ 23.4) sources that are physically associated, or correlated, with the detected sources. We find that the cumulative CIB increases with increased smoothing, reaching 9.82 ± 0.78, 5.77 ± 0.43, and 2.32 ± 0.19 nW m^{-2} sr^{-1} at 250, 350, and 500 μm at 300 arcsec FWHM. This corresponds to a fraction of the fiducial CIB of 0.94 ± 0.23, 1.07 ± 0.31, and 0.97 ± 0.26 at 250, 350, and 500 μm, where the uncertainties are dominated by those of the absolute CIB. We then propose, with a simple model combining parametric descriptions for stacked flux densities and stellar mass functions, that emission from galaxies with log(M/Msun) > 8.5 can account for most of the measured total intensities, and argue against contributions from extended, diffuse emission. Finally, we discuss prospects for future survey instruments to improve the estimates of the absolute CIB levels, and observe any potentially remaining emission at z > 4. Comment: Accepted to ApJL. 6 pages, 3 figure
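
The effect being exploited, blending of faint companions into a smoothed beam, can be seen in a toy one-dimensional model. The source positions and flux values below are invented for illustration:

```python
def boxcar_smooth(signal, width):
    """Smooth a 1-D signal with a boxcar of the given (odd) width,
    effectively zero-padded at the edges."""
    half = width // 2
    return [sum(signal[max(0, i - half): i + half + 1]) / width
            for i in range(len(signal))]

# A bright catalogued source with a faint, uncatalogued companion
# two pixels away.
sky = [0.0] * 101
sky[50] = 10.0   # catalogued source
sky[52] = 2.0    # companion, below the detection limit
raw_peak = sky[50]              # stacking the raw map misses the companion
smoothed = boxcar_smooth(sky, 7)
stacked = smoothed[50] * 7      # aperture sum over the smoothing scale
```

In the raw map the stacked value at the catalogue position is 10.0, but on the smoothed map the aperture sum rises to 12.0 because the companion's flux is blended into the beam, which is exactly why the cumulative CIB estimate grows with increased smoothing.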

    Rebuilding relationships on coral reefs: Coral bleaching knowledge-sharing to aid adaptation planning for reef users: Bleaching emergence on reefs demonstrates the need to consider reef scale and accessibility when preparing for, and responding to, coral bleaching

    Coral bleaching has impacted reefs worldwide and the predictions of near-annual bleaching from over two decades ago have now been realized. While technology currently provides the means to predict large-scale bleaching, predicting reef-scale and within-reef patterns in real-time for all reef users is limited. In 2020, heat stress across the Great Barrier Reef underpinned the region's third bleaching event in 5 years. Here we review the heterogeneous emergence of bleaching across Heron Island reef habitats and discuss the oceanographic drivers that underpinned variable bleaching emergence. We do so as a case study to highlight how reef end-user groups who engage with coral reefs in different ways require targeted guidance for how, and when, to alter their use of coral reefs in response to bleaching events. Our case study of coral bleaching emergence demonstrates how within-reef scale nowcasting of coral bleaching could aid the development of accessible and equitable bleaching response strategies on coral reefs. Also see the video abstract here: https://youtu.be/N9Tgb8N-vN0

    Priority sites for wildfowl conservation in Mexico

    A set of priority sites for wildfowl conservation in Mexico was determined using contemporary count data (1991–2000) from the U.S. Fish & Wildlife Service mid-winter surveys. We used a complementarity approach implemented through linear integer programming that addresses particular conservation concerns for every species included in the analysis and large fluctuations in numbers through time. A set of 31 priority sites was identified, which held more than 69% of the mid-winter count total in Mexico during all surveyed years. Six sites were in the northern highlands, 12 in the central highlands, six on the Gulf of Mexico coast and seven on the upper Pacific coast. Twenty-two sites from the priority set have previously been identified as qualifying for designation as wetlands of international importance under the Ramsar Convention and 20 sites are classified as Important Areas for Bird Conservation in Mexico. The information presented here provides an accountable, spatially-explicit, numerical basis for ongoing conservation planning efforts in Mexico, which can be used to improve existing wildfowl conservation networks in the country and can also be useful for conservation planning exercises elsewhere.
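
The complementarity principle, that each added site should protect species not already covered, is often illustrated with a greedy sketch like the one below. The study itself used exact linear integer programming, and the site and species names here are invented:

```python
def greedy_complementary_sites(site_species, required):
    """Greedy complementarity: repeatedly choose the site covering
    the most species still lacking protection. (A sketch only; the
    exact problem is solved with linear integer programming.)"""
    unmet = set(required)
    chosen = []
    while unmet:
        site = max(site_species, key=lambda s: len(site_species[s] & unmet))
        gained = site_species[site] & unmet
        if not gained:
            raise ValueError("remaining species occur at no site")
        chosen.append(site)
        unmet -= gained
    return chosen

# Invented sites and wildfowl species, purely for illustration:
sites = {
    "Laguna A": {"pintail", "shoveler"},
    "Laguna B": {"shoveler", "gadwall"},
    "Laguna C": {"wigeon"},
}
picked = greedy_complementary_sites(sites, {"pintail", "gadwall", "wigeon"})
```

The greedy heuristic is fast but can return more sites than necessary; the integer-programming formulation used in the study guarantees a minimal (or otherwise optimal) set and can encode per-species count targets rather than simple presence.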